Orthogonal Distance Regression

Authors

  • Paul T. Boggs
  • Janet E. Rogers
Abstract

Orthogonal Distance Regression (ODR) is the name given to the computational problem of finding the maximum likelihood estimators of parameters in measurement error models in the case of normally distributed errors. We examine the stable and efficient algorithm of Boggs, Byrd and Schnabel (SIAM J. Sci. Stat. Comput., 8 (1987), pp. 1052–1078) for finding the solution of this problem when the underlying model is assumed to be nonlinear in both the independent variable and the parameters. We also describe the associated public domain software package, ODRPACK. We then review the results of a simulation study that compares ODR with ordinary least squares (OLS), and present new results from an extension of this study. Finally, we discuss the use of the asymptotic covariance matrix for computing confidence regions and intervals for the estimated parameters. Our conclusions are that ODR is better than OLS for the criteria considered, and that ODRPACK can provide effective solutions and useful statistical information for nonlinear ODR problems.
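The abstract refers to ODRPACK and to the asymptotic covariance matrix it reports. As a hedged illustration of the kind of problem ODR solves, the sketch below uses SciPy's scipy.odr module, which wraps ODRPACK; the exponential model, noise levels, and starting values are invented for the example and are not taken from the paper.

```python
# Minimal sketch (not from the paper): fitting a nonlinear model with
# scipy.odr, the SciPy wrapper around the ODRPACK library.
import numpy as np
from scipy import odr

def model_func(beta, x):
    # Hypothetical exponential model y = b0 * exp(b1 * x), chosen only for illustration.
    return beta[0] * np.exp(beta[1] * x)

rng = np.random.default_rng(0)
x_true = np.linspace(0.0, 2.0, 25)
y_true = 2.0 * np.exp(0.8 * x_true)
# Noise is added to BOTH variables: the errors-in-variables setting that ODR targets.
x_obs = x_true + rng.normal(scale=0.05, size=x_true.size)
y_obs = y_true + rng.normal(scale=0.20, size=y_true.size)

data = odr.RealData(x_obs, y_obs, sx=0.05, sy=0.20)   # error scales on both axes
fit = odr.ODR(data, odr.Model(model_func), beta0=[1.0, 1.0]).run()

print("estimates: ", fit.beta)      # parameter estimates
print("std errors:", fit.sd_beta)   # derived from the asymptotic covariance matrix
print("covariance:", fit.cov_beta)
```

With errors present in both x and y, ODR perturbs both coordinates when measuring misfit, whereas OLS would attribute all of the error to y.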


Similar references

Data Depth for Classical and Orthogonal Regression

We present a comparison of different depth notions which are appropriate for classical and orthogonal regression with and without intercept. We consider the global depth and tangential depth introduced by Mizera (2002) and the simplicial depth first studied in detail for regression by Müller (2005). The global depth and the tangential depth are based on quality functions. These quality funct...


Orthogonal linear regression algorithm based on augmented matrix formulation

Scope and Purpose: In this paper, a new technique for solving a multivariate linear model using orthogonal least absolute values regression is proposed. Orthogonal least absolute values (ORLAV) regression minimises the sum of the absolute orthogonal distances from each data point to the resulting regression hyperplane. In a large set of equations where the variables are independent of ...


Orthogonal Projection in Teaching Regression and Financial Mathematics

Two improvements in teaching linear regression are suggested. The first is to include the population regression model at the beginning of the topic. The second is to use a geometric approach: to interpret the regression estimate as an orthogonal projection and the estimation error as the distance (which is minimized by the projection). Linear regression in finance is described as an example of ...
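As a small numerical companion to the geometric interpretation above, the sketch below (illustrative only; the data and model are made up) checks that the OLS fit equals the orthogonal projection of y onto the column space of the design matrix, and that the residual vector is orthogonal to that space.

```python
# Illustrative check: OLS fitted values are the orthogonal projection of y
# onto the column space of the design matrix X.
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.normal(size=n)
y = 1.5 + 2.0 * x + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x])              # design matrix with intercept
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat                              # fitted values

# Projection (hat) matrix H = X (X'X)^{-1} X'; applying it to y reproduces the fit.
H = X @ np.linalg.inv(X.T @ X) @ X.T
assert np.allclose(H @ y, y_hat)

# The residual vector is orthogonal to every column of X: the estimation error
# is exactly the distance minimized by the projection.
residual = y - y_hat
print(np.round(X.T @ residual, 10))               # ~ [0, 0]
```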


Fitting two concentric spheres to data by orthogonal distance regression

This research tackles the problem of fitting two concentric spheres to data, which arises in computational metrology. Many fitting criteria could be used effectively; the most widely used in metrology, for example, is the sum of squared minimal distances. However, a simple and robust algorithm for using the orthogonal distance regressio...


Orthogonal Nonlinear Least-Squares Regression in R

Orthogonal nonlinear least squares (ONLS) regression is an infrequently applied and largely overlooked regression technique that comes into play when one encounters an "error-in-variables" problem. While classical nonlinear least squares (NLS) aims to minimize the sum of squared vertical residuals, ONLS minimizes the sum of squared orthogonal residuals. The method is based on finding po...
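For consistency with the other sketches, the following toy illustration is written in Python rather than R; it does not reproduce the R package described above, it only contrasts the vertical residual penalized by NLS with the orthogonal residual penalized by ONLS, using a made-up exponential curve and data point.

```python
# Toy comparison of vertical vs. orthogonal residuals for one nonlinear curve.
import numpy as np
from scipy.optimize import minimize_scalar

def f(x, b0=2.0, b1=0.8):
    # Hypothetical model y = b0 * exp(b1 * x); parameters fixed for the demo.
    return b0 * np.exp(b1 * x)

def orthogonal_distance(xi, yi):
    # Distance from (xi, yi) to the closest point (x, f(x)) on the curve.
    res = minimize_scalar(lambda x: (x - xi) ** 2 + (f(x) - yi) ** 2,
                          bounds=(xi - 2.0, xi + 2.0), method="bounded")
    return np.sqrt(res.fun)

xi, yi = 1.0, 5.5
vertical = abs(yi - f(xi))                # the residual NLS penalizes
orthogonal = orthogonal_distance(xi, yi)  # the residual ONLS penalizes
print(vertical, orthogonal)               # the orthogonal residual is never larger
```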



Journal title:

Volume   Issue

Pages  -

Publication date: 2009